On the convergence rate of lp-norm multiple kernel learning
Authors
Abstract
We derive an upper bound on the local Rademacher complexity of lp-norm multiple kernel learning, which yields a tighter excess risk bound than global approaches. Previous local approaches analyzed the case p = 1 only, while our analysis covers all cases 1 ≤ p ≤ ∞, assuming the feature mappings corresponding to the different kernels to be uncorrelated. We also prove a lower bound showing that the upper bound is tight, and derive consequences for the excess loss, namely fast convergence rates of order O(n^{-α/(1+α)}), where α is the minimum eigenvalue decay rate of the individual kernels.
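As a purely numerical illustration (not taken from the paper), the sketch below evaluates the stated excess-risk rate n^{-α/(1+α)} for a few hypothetical decay rates α, showing how faster eigenvalue decay pushes the rate toward the parametric O(1/n):

```python
def excess_risk_rate(n: int, alpha: float) -> float:
    """Evaluate the rate n^{-alpha/(1+alpha)} from the excess-risk bound.

    Larger alpha (faster eigenvalue decay of the individual kernels)
    yields an exponent closer to -1, i.e. a rate closer to O(1/n).
    """
    return n ** (-alpha / (1.0 + alpha))

# Hypothetical decay rates, chosen only for illustration.
for alpha in (1.0, 2.0, 4.0):
    rates = [excess_risk_rate(n, alpha) for n in (100, 10_000)]
    print(f"alpha={alpha}: {[round(r, 5) for r in rates]}")
```

For α = 1 and n = 100 this gives 100^{-1/2} = 0.1; as α grows the same n yields a strictly smaller bound, matching the "fast rate" claim.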
Similar resources
The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning
We derive an upper bound on the local Rademacher complexity of lp-norm multiple kernel learning, which yields a tighter excess risk bound than global approaches. Previous local approaches analyzed the case p = 1 only, while our analysis covers all cases 1 ≤ p ≤ ∞, assuming the feature mappings corresponding to the different kernels to be uncorrelated. We also show a lower boun...
Multiple Kernel Support Vector Regression with Higher Norm in Option Pricing
The purpose of the present study is to investigate a nonparametric model that improves the accuracy of option prices found by previous models. In this study option prices are calculated using multiple kernel Support Vector Regression with different norm values and their results are compared. L1-norm multiple kernel learning Support Vector Regression (MKLSVR) has been successfully applied to option price...
Unifying Framework for Fast Learning Rate of Non-Sparse Multiple Kernel Learning
In this paper, we give a new generalization error bound of Multiple Kernel Learning (MKL) for a general class of regularizations. Our main target in this paper is dense type regularizations including lp-MKL that imposes lp-mixed-norm regularization instead of l1-mixed-norm regularization. According to the recent numerical experiments, the sparse regularization does not necessarily show a good p...
lp-Norm Multiple Kernel Learning
Learning linear combinations of multiple kernels is an appealing strategy when the right choice of features is unknown. Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations to support interpretability and scalability. Unfortunately, this l1-norm MKL is rarely observed to outperform trivial baselines in practical applications. To allow for robust kernel mixtur...
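The non-sparse kernel mixtures described above can be sketched with a simple kernel-weight update. This is an illustrative, hypothetical implementation, assuming the closed-form rule in which each kernel's weight scales with its block norm as θ_m ∝ ‖w_m‖^{2/(p+1)}, rescaled so the weight vector has unit lp-norm; `lp_norm_kernel_weights` and `combine_kernels` are names introduced here, not from the cited papers:

```python
import numpy as np

def lp_norm_kernel_weights(block_norms, p):
    """Sketch of an lp-norm MKL kernel-weight update (illustrative).

    Given per-kernel block norms ||w_m||, set theta_m proportional to
    ||w_m||^{2/(p+1)} and rescale so that ||theta||_p = 1. Sparse (p=1)
    drives weak kernels to zero; larger p keeps all kernels active.
    """
    w = np.asarray(block_norms, dtype=float)
    theta = w ** (2.0 / (p + 1.0))
    return theta / np.linalg.norm(theta, ord=p)

def combine_kernels(kernels, theta):
    """Form the effective Gram matrix K = sum_m theta_m * K_m."""
    return sum(t * K for t, K in zip(theta, kernels))
```

With p close to 1 the resulting weight vector concentrates on the strongest kernels; increasing p flattens the weights, which is the robustness-to-sparsity trade-off the abstract describes.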
A Multiple Kernel Learning Model Based on p-Norm
By utilizing kernel functions, support vector machines (SVMs) successfully solve linearly inseparable problems, and their applicable areas have consequently been greatly extended. Using multiple kernels (MKs) to improve SVM classification accuracy has been a hot topic in the SVM research community for several years. However, most MK learning (MKL) methods employ an L1-norm constraint on the kerne...
Journal: Journal of Machine Learning Research
Volume 13, Issue -
Pages: -
Publication date: 2012